# Processing Multiple Signals with `input_variants()`
When you have multiple signals (from different files, sensors, or test scenarios) that need the same processing, use `.input_variants()` to run them all through the same graph.
## Method 1: Process Multiple Input Signals
```python
from sigexec import Graph
from sigexec.blocks import RangeCompress, DopplerCompress
from sigexec.core.data import GraphData

# Load or generate different signals
signal_dataset_a = load_signal("dataset_a.bin")
signal_dataset_b = load_signal("dataset_b.bin")
signal_dataset_c = load_signal("dataset_c.bin")

# Process all three through the same graph
results = (Graph()
    .input_variants([signal_dataset_a, signal_dataset_b, signal_dataset_c],
                    names=['Dataset A', 'Dataset B', 'Dataset C'])
    .add(RangeCompress(window='hamming'))
    .add(DopplerCompress(window='hann'))
    .run()
)

# Results contains one (params, result) tuple for each input signal
for params, result in results:
    dataset_name = params['variant'][0]
    print(f"{dataset_name}: peak at {find_peak(result)}")
```
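Conceptually, `.input_variants()` fans every input through the identical chain of blocks and returns one `(params, result)` pair per input. A stdlib-only sketch of that behavior (the `pipeline` function and the toy inputs are stand-ins, not part of `sigexec`):

```python
def run_input_variants(inputs, names, pipeline):
    """Apply the same pipeline to each named input, like .input_variants()."""
    results = []
    for name, signal in zip(names, inputs):
        params = {'variant': (name,)}  # mirrors params['variant'][0]
        results.append((params, pipeline(signal)))
    return results

# Toy "pipeline": scale then offset, standing in for the block chain
pipeline = lambda sig: [2 * x + 1 for x in sig]

results = run_input_variants(
    [[1, 2], [3, 4], [5, 6]],
    names=['Dataset A', 'Dataset B', 'Dataset C'],
    pipeline=pipeline,
)

for params, result in results:
    print(params['variant'][0], result)
```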
### Live Example: Three Different Target Scenarios
Processing three radar scenarios with different target parameters through the same range/Doppler compression graph.
## Method 2: Lazy Loading with Variants
For large datasets or many files, you don't want to load everything into memory at once. Use `.variants()` with a loader factory to load data lazily during graph execution.
```python
from sigexec import Graph
from sigexec.blocks import StackPulses, RangeCompress, DopplerCompress
from sigexec.core.data import GraphData
import numpy as np

# Create a loader factory - data is loaded only when the variant executes
def make_loader(filename):
    def load(_):
        data = np.load(filename)  # Loaded on demand, not upfront
        return GraphData(data, metadata={'sample_rate': 20e6})
    return load

# Process multiple files through the same graph
# Each file is loaded only when needed, one at a time
results = (Graph()
    .variants(make_loader,
              ['dataset_a.npy', 'dataset_b.npy', 'dataset_c.npy'],
              names=['Dataset A', 'Dataset B', 'Dataset C'])
    .add(StackPulses())
    .add(RangeCompress(window='hamming'))
    .add(DopplerCompress(window='hann'))
    .run()
)

# Results contains one (params, result) tuple for each file
for params, result in results:
    dataset_name = params['variant'][0]
    print(f"Processed {dataset_name}")
```
### Live Example: Lazy Loading from Saved Files
First, generate and save three different target scenarios to files:
✓ Saved 3 signal files to temporary directory
Now load and process them lazily, one file at a time:
## Method 3: Combine Lazy Loading with Processing Variants
Combine lazy-loaded data variants with processing parameter variants to explore the full cartesian product without loading all data into memory at once.
```python
from sigexec import Graph
from sigexec.blocks import StackPulses, RangeCompress, DopplerCompress
from sigexec.core.data import GraphData
import numpy as np

# Loader factory
def make_loader(filename):
    def load(_):
        data = np.load(filename)
        return GraphData(data, metadata={'sample_rate': 20e6})
    return load

# 3 files × 2 range windows × 2 Doppler windows = 12 total combinations
# But only one file is in memory at a time!
results = (Graph()
    .variants(make_loader,
              ['sig_a.npy', 'sig_b.npy', 'sig_c.npy'],
              names=['Signal A', 'Signal B', 'Signal C'])
    .add(StackPulses())
    .variants(lambda w: RangeCompress(window=w),
              ['hamming', 'blackman'],
              names=['Hamming', 'Blackman'])
    .variants(lambda w: DopplerCompress(window=w),
              ['hann', 'hamming'],
              names=['Hann', 'Hamming'])
    .run()
)

# Access all three levels of variants
for params, result in results:
    signal_name = params['variant'][0]
    range_window = params['variant'][1]
    doppler_window = params['variant'][2]
    print(f"{signal_name} + Range:{range_window} + Doppler:{doppler_window}")
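The tuple in `params['variant']` is ordered the same way the `.variants()` calls are chained, and the fan-out is a full cartesian product. A stdlib sketch of the resulting 12 combinations (the names mirror the example above; the exact iteration order inside `sigexec` is an assumption):

```python
from itertools import product

signals = ['Signal A', 'Signal B', 'Signal C']
range_windows = ['Hamming', 'Blackman']
doppler_windows = ['Hann', 'Hamming']

# One tuple per executed pipeline, in chained-variant order
variants = list(product(signals, range_windows, doppler_windows))
print(len(variants))  # 12
print(variants[0])    # ('Signal A', 'Hamming', 'Hann')
```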
### Live Example: 2 Files × 2 Range × 2 Doppler = 8 Combinations
Generating and saving 2 signals, then loading lazily with processing variants:
| Signal | Range Window | Doppler Window | Peak Value |
|---|---|---|---|
| /tmp/tmp774dua64/target1.npz | hamming | hann | 600.8 |
| /tmp/tmp774dua64/target1.npz | hamming | hamming | 648.9 |
| /tmp/tmp774dua64/target1.npz | blackman | hann | 466.6 |
| /tmp/tmp774dua64/target1.npz | blackman | hamming | 503.9 |
| /tmp/tmp774dua64/target2.npz | hamming | hann | 610.4 |
| /tmp/tmp774dua64/target2.npz | hamming | hamming | 662.3 |
| /tmp/tmp774dua64/target2.npz | blackman | hann | 473.7 |
| /tmp/tmp774dua64/target2.npz | blackman | hamming | 514.0 |
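Once the run finishes, the `(params, result)` pairs can be reduced with ordinary Python. A stdlib sketch that groups the peak values from the table above by signal and reports the best window combination (the rows are copied from the table, with file paths shortened for readability):

```python
# (signal, range_window, doppler_window, peak) copied from the table above
rows = [
    ('target1', 'hamming',  'hann',    600.8),
    ('target1', 'hamming',  'hamming', 648.9),
    ('target1', 'blackman', 'hann',    466.6),
    ('target1', 'blackman', 'hamming', 503.9),
    ('target2', 'hamming',  'hann',    610.4),
    ('target2', 'hamming',  'hamming', 662.3),
    ('target2', 'blackman', 'hann',    473.7),
    ('target2', 'blackman', 'hamming', 514.0),
]

# Keep the highest-peak window combination per signal
best = {}
for signal, rw, dw, peak in rows:
    if signal not in best or peak > best[signal][2]:
        best[signal] = (rw, dw, peak)

for signal, (rw, dw, peak) in best.items():
    print(f"{signal}: best peak {peak} with range={rw}, doppler={dw}")
```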
## Summary
Key benefits of lazy loading with `.variants()`:

1. **Memory Efficient**: Only one signal in memory at a time
2. **Scalable**: Process hundreds of files without memory issues
3. **Flexible**: Combine with processing variants for full exploration
4. **Consistent Processing**: Same graph applied to all data
5. **Easy Pattern**: Just wrap your loader in a factory function

Pattern to remember:

```python
def make_loader(filename):
    def load(_):
        # Load happens here, during execution
        data = load_from_somewhere(filename)
        return GraphData(data, metadata={'sample_rate': ...})
    return load

results = Graph().variants(make_loader, file_list, names=...).add(...).run()
```

Use cases:

- Processing large datasets that don't fit in memory
- Batch processing many files from disk or network
- Testing algorithms across multiple scenarios
- Comparing data from different sensors or time periods